PAC-Bayesian Theorems for Gaussian Process Classification
Abstract
We present distribution-free generalization error bounds which apply to a wide class of approximate Bayesian Gaussian process classification (GPC) techniques, powerful nonparametric learning methods similar to Support Vector Machines. The bounds use the PAC-Bayesian theorem [8], for which we provide a simplified proof, leading to new insights into its relation to traditional VC-type union bound techniques. Experiments on the MNIST database show that our bounds can be very tight for moderate training sample sizes. Our proofs require only elementary concepts of convexity, well known in machine learning and information theory, and the bounds give a strong learning-theoretical motivation for the use of Bayesian GPC techniques.
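As a small numerical illustration (not the paper's own computation), the PAC-Bayesian theorem in its common kl form states that, with probability at least 1 − δ over a sample of size n, kl(empirical Gibbs risk ‖ true Gibbs risk) ≤ (KL(Q‖P) + ln(2√n/δ))/n for every posterior Q. Inverting the binary kl by bisection then turns this into an explicit risk bound; the parameter values below are hypothetical:

```python
import math

def binary_kl(q, p):
    """kl(q||p) for Bernoulli means, using the convention 0*log(0) = 0."""
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    total = 0.0
    if q > 0:
        total += q * math.log(q / p)
    if q < 1:
        total += (1 - q) * math.log((1 - q) / (1 - p))
    return total

def kl_inverse_upper(q, c, tol=1e-9):
    """Largest p in [q, 1) with kl(q||p) <= c, found by bisection."""
    lo, hi = q, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_kl(q, mid) <= c:
            lo = mid
        else:
            hi = mid
    return lo

def pac_bayes_kl_bound(emp_risk, kl_qp, n, delta):
    """PAC-Bayes-kl upper bound on the true Gibbs risk."""
    c = (kl_qp + math.log(2 * math.sqrt(n) / delta)) / n
    return kl_inverse_upper(emp_risk, c)

# Hypothetical numbers: empirical Gibbs risk 0.02, KL(Q||P) = 5 nats,
# n = 10000 training points, confidence parameter delta = 0.05.
print(pac_bayes_kl_bound(0.02, 5.0, 10000, 0.05))
```

For moderate n the bound stays close to the empirical risk, which is the tightness phenomenon the abstract reports on MNIST.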
Similar Papers
PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification
Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to support vector machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of Mc...
PAC-Bayesian Generalization Error Bounds for Gaussian Process Classification
Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to Support Vector Machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of Mc...
متن کاملPAC-Bayesian AUC classification and scoring
We develop a scoring and classification procedure based on the PAC-Bayesian approach and the AUC (Area Under Curve) criterion. We focus initially on the class of linear score functions. We derive PAC-Bayesian non-asymptotic bounds for two types of prior for the score parameters: a Gaussian prior, and a spike-and-slab prior; the latter makes it possible to perform feature selection. One importan...
PAC-Bayesian Bound for Gaussian Process Regression and Multiple Kernel Additive Model
We develop a PAC-Bayesian bound for the convergence rate of a Bayesian variant of Multiple Kernel Learning (MKL) that is an estimation method for the sparse additive model. Standard analyses for MKL require a strong condition on the design analogous to the restricted eigenvalue condition for the analysis of Lasso and Dantzig selector. In this paper, we apply PAC-Bayesian technique to show that ...
PAC-Bayesian Bounds based on the Rényi Divergence
We propose a simplified proof process for PAC-Bayesian generalization bounds that allows the proof to be divided into four successive inequalities, easing the “customization” of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence....
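As a brief numerical aside (not taken from the paper above): for discrete distributions the Rényi divergence D_α(p‖q) = (1/(α−1)) log Σ_i p_i^α q_i^(1−α) recovers the Kullback-Leibler divergence in the limit α → 1, which is why the Rényi family can be seen as a generalization of the usual KL-based PAC-Bayesian bounds. The example distributions are arbitrary:

```python
import math

def kl_divergence(p, q):
    """KL(p||q) for discrete distributions, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p||q) for discrete distributions, alpha != 1."""
    assert alpha > 0 and alpha != 1
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Two arbitrary example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Near alpha = 1 the Renyi divergence approaches the KL divergence.
print(kl_divergence(p, q), renyi_divergence(p, q, 1.0001))
```

The Rényi divergence is nondecreasing in α, so bounds at α > 1 are never looser than what KL would give in this comparison.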